Explicit Max Margin Input Feature Selection for Nonlinear SVM using Second Order Methods
Authors
Abstract
Incorporating feature selection in nonlinear SVMs leads to a large and challenging nonconvex minimization problem, which can be prone to suboptimal solutions. We use a second order optimization method that utilizes eigenvalue information and is less likely to get stuck at suboptimal solutions. We devise an alternating optimization approach to tackle the problem efficiently, breaking it down into a convex subproblem, corresponding to SVM optimization, and a nonconvex subproblem for feature selection. Importantly, we show that a naive approach, which implicitly maximizes the margin, can be susceptible to saddle point solutions. We propose a new explicit margin maximization method, which overcomes the drawbacks of the implicit approach. To avoid being trapped at saddle points, we first introduce an auxiliary variable and perform alternating optimization in the extended space, and then subsequently, to improve solution quality, we relax the use of the geometric margin and maximize the functional margin in the feature selection subproblem. We demonstrate how this technique can also be applied to 1-norm linear programming SVMs. Experimental results show the explicit margin approach works effectively in the presence of class noise and many irrelevant features, consistently outperforming leading filter, wrapper, and other embedded nonlinear feature selection methods.
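The alternating scheme described above can be illustrated with a minimal numpy sketch. This is not the authors' algorithm: it uses a linear kernel, plain subgradient descent in place of the second order method, and a hinge-loss (functional-margin) surrogate for the feature selection step; the function names and all hyperparameters are illustrative. It only shows the structure of alternating between a convex SVM subproblem (with feature scalings fixed) and a nonconvex feature-weighting subproblem (with the SVM fixed).

```python
import numpy as np

def alternating_feature_selection_svm(X, y, C=1.0, outer_iters=20,
                                      inner_iters=200, lr_w=1e-2, lr_s=1e-2):
    """Alternate between an SVM fit and a feature-scaling update.

    X : (n, d) data matrix, y : (n,) labels in {-1, +1}.
    Returns the SVM weights w, bias b, and feature scalings s.
    """
    n, d = X.shape
    w = np.zeros(d)
    b = 0.0
    s = np.ones(d)              # per-feature scaling factors
    for _ in range(outer_iters):
        Xs = X * s              # data in the current scaled feature space
        # (1) Convex SVM subproblem: subgradient descent on the
        #     regularized hinge loss with the scalings s held fixed.
        for _ in range(inner_iters):
            margins = 1.0 - y * (Xs @ w + b)
            active = margins > 0                      # margin violators
            grad_w = w - C * (y[active, None] * Xs[active]).sum(axis=0)
            grad_b = -C * y[active].sum()
            w -= lr_w * grad_w
            b -= lr_w * grad_b
        # (2) Nonconvex feature-selection subproblem: one gradient step
        #     on s against the same hinge loss, i.e. a functional-margin
        #     surrogate, with the SVM (w, b) held fixed.
        margins = 1.0 - y * ((X * s) @ w + b)
        active = margins > 0
        grad_s = -C * ((y[active, None] * X[active]) * w).sum(axis=0)
        s = np.maximum(s - lr_s * grad_s, 0.0)        # keep scalings >= 0
    return w, b, s
```

Features whose scaling is driven toward zero are effectively deselected; sparsity-inducing penalties on s (as in the 1-norm linear programming SVM mentioned above) would make that explicit.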
Similar Articles
Primal explicit max margin feature selection for nonlinear support vector machines
Embedding feature selection in nonlinear SVMs leads to a challenging non-convex minimization problem, which can be prone to suboptimal solutions. This paper develops an effective algorithm to directly solve the embedded feature selection primal problem. We use a trust-region method, which is better suited for non-convex optimization compared to line-search methods, and guarantees convergence to...
Margin-based Feature Selection Techniques for Support Vector Machine Classification
Feature selection for classification working in high-dimensional feature spaces can improve generalization accuracy, reduce classifier complexity, and is also useful for identifying the important feature “markers”, e.g., biomarkers in a bioinformatics or biomedical context. For support vector machine (SVM) classification, a widely used feature selection technique is recursive feature eliminatio...
Feature Weighting Using Margin and Radius Based Error Bound Optimization in SVMs
The Support Vector Machine error bound is a function of the margin and radius. Standard SVM algorithms maximize the margin within a given feature space, therefore the radius is fixed and thus ignored in the optimization. We propose an extension of the standard SVM optimization in which we also account for the radius in order to produce an even tighter error bound than what we get by controlling...
Enhancing Kernel Maximum Margin Projection for Face Recognition
To deal efficiently with the face recognition problem, a novel face recognition algorithm based on enhancing kernel maximum margin projection (MMP) is proposed in this paper. The main contributions of this work are as follows. First, the nonlinear extension of MMP through the kernel trick is adopted to capture the nonlinear structure of face images. Second, the kernel deformation technique is propos...
Steel Consumption Forecasting Using Nonlinear Pattern Recognition Model Based on Self-Organizing Maps
Steel consumption is a critical factor affecting pricing decisions and a key element in achieving sustainable industrial development. Forecasting future trends of steel consumption based on analysis of nonlinear patterns using artificial intelligence (AI) techniques is the main purpose of this paper. Because there are several features affecting the target variable which make the analysis of relations...